Grouping by length makes training loss oscillate and makes evaluation loss worse

Has anyone noticed different behavior when training with group_by_length versus without it? Do the uneven sequence lengths across batches cause worse optimizer steps and come at the price of less effective training?
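For context, here is a minimal sketch of the idea behind length grouping (not the exact sampler the Trainer uses): sequences are bucketed by length so each batch has little padding, but batches are no longer i.i.d. in length, so per-step gradients can vary more. All names below are illustrative.

```python
import random

def random_batches(lengths, batch_size):
    # Plain shuffled batching: each batch mixes short and long sequences.
    idx = list(range(len(lengths)))
    random.shuffle(idx)
    return [idx[i:i + batch_size] for i in range(0, len(idx), batch_size)]

def length_grouped_batches(lengths, batch_size):
    # Sort indices by sequence length, slice into batches, then shuffle
    # only the batch order -- batch contents stay length-homogeneous.
    idx = sorted(range(len(lengths)), key=lambda i: lengths[i])
    batches = [idx[i:i + batch_size] for i in range(0, len(idx), batch_size)]
    random.shuffle(batches)
    return batches

lengths = [random.randint(5, 512) for _ in range(64)]
for name, fn in [("random", random_batches), ("grouped", length_grouped_batches)]:
    spreads = [max(lengths[i] for i in b) - min(lengths[i] for i in b)
               for b in fn(lengths, 8)]
    print(name, "avg length spread per batch:", sum(spreads) / len(spreads))
```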


I had a similar issue: my training loss would spike at intervals of exactly 50 steps. I was going crazy until I finally figured out that group_by_length can cause issues like this.

The solution is to set group_by_length=False in your TrainingArguments.
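A minimal sketch of that change, assuming a standard Hugging Face Trainer setup (output_dir and the other values here are placeholders, adjust them to your own run):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./results",              # placeholder path
    per_device_train_batch_size=8,       # placeholder value
    num_train_epochs=3,                  # placeholder value
    group_by_length=False,               # disable length-grouped batching
)
```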
